High Accurate and a Variant of k-fold Cross Validation Technique for Predicting the Decision Tree Classifier Accuracy

Authors

Abstract

In machine learning, how the data is used matters more than the logic of the program. With very large and moderately sized datasets it is possible to obtain robust, high classification accuracies, but not with small datasets; in particular, only large training sets can produce reliable decision tree results. Results obtained from a single training/testing dataset pair are not reliable. The cross-validation technique instead uses many random folds of the same data for validation. For the results to be reliable and statistically correct, the algorithm needs to be applied to different training/testing pairs. To overcome the problems of the single existing k-fold cross-validation plan and to obtain increased accuracy, this paper proposes a new technique called prime fold. It is experimentally tested thoroughly and then verified on benchmark UCI datasets. After experimentation it is observed that the proposed technique is far better than existing techniques at finding accuracies.
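As a point of reference, the sketch below runs the standard k-fold cross-validation baseline that the proposed prime fold technique is compared against: a decision tree classifier evaluated over ten random folds of a small UCI benchmark dataset (Iris), using scikit-learn. The prime fold procedure itself is not detailed in this abstract and is not reproduced here.

# Minimal sketch of the standard k-fold cross-validation baseline for a
# decision tree classifier, assuming scikit-learn and the UCI Iris dataset.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)            # small UCI benchmark dataset
clf = DecisionTreeClassifier(random_state=0)

# k = 10: each fold serves once as the test set, the other nine as training data.
scores = cross_val_score(clf, X, y, cv=10)
print("per-fold accuracy:", scores.round(3))
print("mean accuracy: %.3f (std %.3f)" % (scores.mean(), scores.std()))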

Related articles

The 'K' in K-fold Cross Validation

The K-fold Cross Validation (KCV) technique is one of the most used approaches by practitioners for model selection and error estimation of classifiers. The KCV consists in splitting a dataset into k subsets; then, iteratively, some of them are used to learn the model, while the others are exploited to assess its performance. However, in spite of the KCV success, only practical rule-of-thumb me...
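A minimal sketch of the KCV loop described above: split a dataset into k subsets, then iteratively learn the model on k-1 of them while assessing performance on the held-out one. The dataset and classifier are illustrative choices, not ones taken from the paper.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import KFold
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
kf = KFold(n_splits=5, shuffle=True, random_state=0)

fold_scores = []
for train_idx, test_idx in kf.split(X):
    model = DecisionTreeClassifier(random_state=0)
    model.fit(X[train_idx], y[train_idx])                      # learn on k-1 subsets
    fold_scores.append(model.score(X[test_idx], y[test_idx]))  # assess on the held-out one
print("fold accuracies:", np.round(fold_scores, 3))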

A K-fold Averaging Cross-validation Procedure.

Cross-validation type of methods have been widely used to facilitate model estimation and variable selection. In this work, we suggest a new K-fold cross validation procedure to select a candidate 'optimal' model from each hold-out fold and average the K candidate 'optimal' models to obtain the ultimate model. Due to the averaging effect, the variance of the proposed estimates can be significan...
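One way the averaging idea above can be realized is sketched below: on each hold-out fold a candidate 'optimal' model is selected (here, a Lasso whose regularization strength scores best on that fold), and the K candidates' coefficients are averaged into the ultimate model. The model family and the alpha grid are assumptions for illustration, not the paper's exact procedure.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.model_selection import KFold

X, y = make_regression(n_samples=200, n_features=10, noise=5.0, random_state=0)
alphas = [0.01, 0.1, 1.0, 10.0]
kf = KFold(n_splits=5, shuffle=True, random_state=0)

coefs, intercepts = [], []
for train_idx, test_idx in kf.split(X):
    # select the candidate 'optimal' model on this hold-out fold
    best = max(alphas, key=lambda a: Lasso(alpha=a)
               .fit(X[train_idx], y[train_idx])
               .score(X[test_idx], y[test_idx]))
    model = Lasso(alpha=best).fit(X[train_idx], y[train_idx])
    coefs.append(model.coef_)
    intercepts.append(model.intercept_)

# average the K candidate models into the ultimate model
avg_coef = np.mean(coefs, axis=0)
avg_intercept = np.mean(intercepts)
print("averaged coefficients:", np.round(avg_coef, 2))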

Efficient algorithms for decision tree cross-validation

Cross-validation is a useful and generally applicable technique often employed in machine learning, including decision tree induction. An important disadvantage of straightforward implementation of the technique is its computational overhead. In this paper we show that, for decision trees, the computational overhead of cross-validation can be reduced significantly by integrating the crossvalida...


Estimators of Variance for K-Fold Cross-Validation

Motivations: In machine learning, the standard measure of accuracy for models is the prediction error (PE), i.e. the expected loss on future examples. We consider here the i.i.d. regression or classification setups, where future examples are assumed to be independently sampled from the distribution that generated the training set. When the data distribution is unknown, PE cannot be computed. T...
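As a rough illustration of the quantities involved, the sketch below estimates PE as the mean 0-1 loss over k held-out folds and computes the naive variance estimate (empirical variance of fold losses divided by k). As the cited work argues, fold losses are correlated, so this naive estimator should not be treated as unbiased; the dataset and classifier here are illustrative assumptions.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
k = 10
scores = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=k)
errors = 1.0 - scores                 # 0-1 loss on each held-out fold

pe_hat = errors.mean()                # estimate of the prediction error
naive_var = errors.var(ddof=1) / k    # naive (biased) variance estimate
print("PE estimate: %.3f, naive variance: %.5f" % (pe_hat, naive_var))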


Journal

Journal title: International Journal of Innovative Technology and Exploring Engineering

Year: 2021

ISSN: 2278-3075

DOI: https://doi.org/10.35940/ijitee.c8403.0110321